Heavy Quark Effective Theory on the Light Front
The light-front heavy quark effective theory is derived to all orders in
1/m_Q. In the limit m_Q → ∞, the theory exhibits the familiar
heavy quark spin-flavor symmetry. This new formalism permits a straightforward
canonical quantization to all orders in 1/m_Q; moreover, the higher-order terms
have rather simple operator structures. The light-front heavy quark effective
theory can serve as a useful framework for studying the non-perturbative QCD
dynamics of heavy hadron bound states.
Comment: 11 pages, revtex, no figures
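For context, the standard equal-time HQET Lagrangian that the light-front formalism parallels takes the following tree-level form (standard notation, not the light-front expression derived in the paper):

```latex
\mathcal{L}_{\mathrm{HQET}} = \bar h_v\, i v\!\cdot\! D\, h_v
  + \frac{1}{2m_Q}\,\bar h_v\,(iD_\perp)^2\, h_v
  + \frac{g}{4m_Q}\,\bar h_v\,\sigma_{\mu\nu}G^{\mu\nu} h_v
  + \mathcal{O}(1/m_Q^2)
```

Here h_v is the heavy-quark field of fixed velocity v. The leading term is independent of both m_Q and the heavy-quark spin, which is the origin of the spin-flavor symmetry in the m_Q → ∞ limit; the 1/m_Q corrections (kinetic and chromomagnetic) break it.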
Soft Methodology for Cost-and-error Sensitive Classification
Many real-world data mining applications involve varying costs for different
types of classification errors and thus call for cost-sensitive classification
algorithms. Existing algorithms for cost-sensitive classification are
successful in terms of minimizing the cost, but can result in a high error rate
as the trade-off. The high error rate holds back the practical use of those
algorithms. In this paper, we propose a novel cost-sensitive classification
methodology that takes both the cost and the error rate into account. The
methodology, called soft cost-sensitive classification, is established from a
multicriteria optimization problem of the cost and the error rate, and can be
viewed as regularizing cost-sensitive classification with the error rate. The
simple methodology allows immediate improvements of existing cost-sensitive
classification algorithms. Experiments on benchmark and real-world data
sets show that our proposed methodology indeed achieves lower test error rates
than existing cost-sensitive classification algorithms, with similar (and
sometimes lower) test costs. We also demonstrate that the methodology can be
extended to consider a weighted error rate instead of the original error
rate, an extension useful for tackling unbalanced classification problems.
Comment: A shorter version appeared in KDD '1
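One way to picture the regularization idea is to blend the task's cost matrix with the ordinary 0/1 error cost and then make cost-minimizing predictions. The sketch below is illustrative only; the blend parameter `alpha` and the helper names are assumptions, not the paper's exact formulation:

```python
import numpy as np

def soft_cost_matrix(C, alpha=0.5):
    """Blend a task cost matrix C with the 0/1 error cost matrix.

    alpha=0 recovers pure cost-sensitive classification;
    alpha=1 recovers ordinary error-minimizing classification.
    (This specific linear blend is an illustrative choice.)
    """
    C = np.asarray(C, dtype=float)
    C01 = 1.0 - np.eye(C.shape[0])      # 0/1 misclassification cost
    return (1.0 - alpha) * C + alpha * C01

def bayes_predict(posteriors, C):
    """Pick, per row, the class minimizing expected cost under p(y|x)."""
    return np.argmin(posteriors @ C, axis=1)

# toy example: misclassifying class 1 is 10x costlier than class 0
C = np.array([[0.0, 1.0],
              [10.0, 0.0]])
p = np.array([[0.7, 0.3]])               # posterior favors class 0
print(bayes_predict(p, C))               # pure cost-sensitive: class 1
print(bayes_predict(p, soft_cost_matrix(C, alpha=0.9)))  # near 0/1: class 0
```

With a heavily skewed cost matrix, the pure cost-sensitive rule overrides a confident posterior; the blended matrix pulls the decision back toward the error-minimizing one, which is the trade-off the methodology tunes.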
Hidden Trends in 90 Years of Harvard Business Review
In this paper, we demonstrate and discuss results of our mining the abstracts
of the publications in Harvard Business Review between 1922 and 2012.
Techniques for computing n-grams, collocations, basic sentiment analysis, and
named-entity recognition were employed to uncover trends hidden in the
abstracts. We present findings about international relationships, sentiment in
HBR's abstracts, important international companies, influential technological
inventions, renowned researchers in management theory, and US presidents via
chronological analyses.
Comment: 6 pages, 14 figures, Proceedings of 2012 International Conference on Technologies and Applications of Artificial Intelligence
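The n-gram counting step mentioned above can be sketched with the standard library alone. The mini-corpus below is a hypothetical stand-in for the HBR abstracts, not data from the study:

```python
from collections import Counter
from itertools import islice

def ngrams(tokens, n):
    """Yield successive n-grams (as tuples) from a token list."""
    return zip(*(islice(tokens, i, None) for i in range(n)))

# hypothetical mini-corpus standing in for the HBR abstracts
abstracts = [
    "management theory shapes modern business practice",
    "modern business practice evolves with technology",
]
tokens = " ".join(abstracts).split()
bigram_counts = Counter(ngrams(tokens, 2))
print(bigram_counts.most_common(2))   # most frequent bigrams first
```

Collocation finding then typically ranks such n-gram counts against the frequencies of their component words (e.g. by pointwise mutual information) rather than by raw count alone.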
Identifiability of the Simplex Volume Minimization Criterion for Blind Hyperspectral Unmixing: The No Pure-Pixel Case
In blind hyperspectral unmixing (HU), the pure-pixel assumption is well-known
to be powerful in enabling simple and effective blind HU solutions. However,
the pure-pixel assumption is not always satisfied in an exact sense, especially
for scenarios where pixels are heavily mixed. In the no pure-pixel case, a good
blind HU approach to consider is the minimum volume enclosing simplex (MVES).
Empirical experience suggests that MVES algorithms can perform well
without pure pixels, although a theoretical explanation for why has been
lacking. This paper aims to address that issue. We develop
an analysis framework wherein the perfect endmember identifiability of MVES is
studied in the noiseless case. We prove that MVES is indeed robust against
the lack of pure pixels, as long as the pixels are neither too heavily mixed
nor too asymmetrically spread. The theoretical results are verified by
numerical simulations.
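Under the linear mixing model assumed in blind HU, each pixel is a convex combination of endmember signatures, so the pixels live inside the simplex whose vertices are the endmembers; MVES seeks the minimum-volume simplex enclosing them. The sketch below generates a toy noiseless scene in the no-pure-pixel regime; the endmember matrix and Dirichlet parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical endmember signatures: 3 spectral bands x 3 endmembers
A = np.array([[0.9, 0.1, 0.2],
              [0.1, 0.8, 0.3],
              [0.2, 0.2, 0.9]])

# abundance vectors on the unit simplex (nonnegative, summing to 1);
# Dirichlet draws with concentration > 1 stay away from the simplex
# corners, i.e. no pixel is pure -- the regime the analysis targets
S = rng.dirichlet(alpha=[5.0, 5.0, 5.0], size=200).T   # 3 x 200 abundances
X = A @ S                                              # noiseless mixed pixels

# purity measure: largest abundance in any pixel (a pure pixel would hit 1.0)
print(float(S.max()))   # stays well below 1 -> no pure pixels
```

The identifiability question is then whether the minimum-volume simplex enclosing the columns of X still has vertices A; the paper's result says yes, provided the abundances are not too concentrated near the simplex center or too asymmetrically spread.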